Neural Nets via Forward State Transformation and Backward Loss Transformation
Authors
Abstract
This article studies (multilayer perceptron) neural networks with an emphasis on the transformations involved, both forward and backward, in order to develop a semantic/logical perspective that is in line with standard program semantics. The common two-pass neural network training algorithms make this viewpoint particularly fitting. In the forward direction, neural networks act as state transformers. In the backward direction, they transform losses on outputs into losses on inputs, thereby acting like (real-valued) predicate transformers. In this way, backpropagation is functorial by construction, as shown in other recent work. We illustrate this perspective by training a simple instance of a neural network.
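To make the two directions concrete, here is a minimal NumPy sketch of this two-pass view (illustrative only; the 2-4-1 architecture, sigmoid activations, squared-error loss, and learning rate are assumptions, not details taken from the article). Each layer's `forward` method acts as a state transformer, while its `backward` method transforms a loss gradient on the output into a loss gradient on the input, which is the shape of backpropagation described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Layer:
    """One dense layer with a sigmoid activation."""

    def __init__(self, n_in, n_out):
        self.W = rng.normal(scale=1.0, size=(n_out, n_in))
        self.b = np.zeros(n_out)

    def forward(self, x):
        # Forward direction: the layer is a state transformer,
        # mapping an input state x to an output state y.
        self.x = x
        self.y = sigmoid(self.W @ x + self.b)
        return self.y

    def backward(self, dL_dy, lr=1.0):
        # Backward direction: the layer is a loss transformer,
        # mapping the loss gradient on its output to the loss
        # gradient on its input, updating W and b along the way.
        dL_dz = dL_dy * self.y * (1.0 - self.y)   # through the sigmoid
        dL_dx = self.W.T @ dL_dz                  # loss pulled back to the input
        self.W -= lr * np.outer(dL_dz, self.x)
        self.b -= lr * dL_dz
        return dL_dx

# Toy instance: a 2-4-1 network trained on XOR with squared-error loss.
layers = [Layer(2, 4), Layer(4, 1)]
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

for _ in range(5000):
    for x, t in zip(X, T):
        s = x
        for layer in layers:            # forward pass: transform states
            s = layer.forward(s)
        dL = s - t                      # gradient of the loss (1/2)(s - t)^2
        for layer in reversed(layers):  # backward pass: transform losses
            dL = layer.backward(dL)

for x in X:
    s = x
    for layer in layers:
        s = layer.forward(s)
    print(x, "->", round(float(s[0]), 3))
```

Running the forward methods left to right and the backward methods right to left mirrors the functoriality remark: the backward map of a composite network is the composite of the layers' backward maps, taken in reverse order.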
Similar resources
Type Specialization for Effective Bidirectionalization
A bidirectional transformation is a pair of transformations, a forward transformation and a backward transformation, where a forward transformation maps one data structure called source to another called view, and a corresponding backward transformation reflects changes on the view to the source. Its practical applications include replicated data synchronization, presentation-oriented editor de...
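For intuition, such a pair is commonly packaged as a lens: a get from source to view and a put that writes an edited view back into the source, subject to round-trip laws. A minimal sketch follows (the record and its fields are invented for illustration; this is not code from the report):

```python
# A lens as a (get, put) pair of transformations.
# get : source -> view;  put : (source, edited view) -> updated source.

def get(source: dict) -> str:
    # Forward transformation: project the view out of the source.
    return source["name"]

def put(source: dict, view: str) -> dict:
    # Backward transformation: reflect a changed view back into the
    # source, leaving the remaining fields untouched.
    return {**source, "name": view}

src = {"name": "Ada", "age": 36}
assert put(src, get(src)) == src           # GetPut: no edit, no change
assert get(put(src, "Grace")) == "Grace"   # PutGet: edits survive the round trip
```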
Monotonic Extensions of Petri Nets: Forward and Backward Search Revisited
In this paper, we revisit the forward and backward approaches to the verification of extensions of infinite-state Petri Nets. As contributions, we propose an efficient data structure to represent infinite downward-closed sets of markings and to compute symbolically the minimal coverability set of Petri Nets, we identify a subclass of Transfer Nets for which the forward approach generalizes and ...
Combining Forward and Backward Abstract Interpretation of Horn Clauses
Alternation of forward and backward analyses is a standard technique in abstract interpretation of programs, which is particularly useful when we wish to prove unreachability of some undesired program states. The current state-of-the-art technique for combining forward (bottom-up, in logic programming terms) and backward (top-down) abstract interpretation of Horn clauses is query-answer transfo...
Formal Specification of Model Transformations by Triple Graph Grammars with Application Conditions
Triple graph grammars are a successful approach to describe exogenous model transformations, i.e. transformations between models conforming to different meta-models. Source and target models are related by some connection part, triple rules describe the simultaneous construction of these parts, and forward and backward rules can be derived modeling the forward and backward model transformations...
Global Solar Radiation Prediction for Makurdi, Nigeria Using Feed Forward Backward Propagation Neural Network
The optimum design of solar energy systems strongly depends on the accuracy of solar radiation data. However, the availability of accurate solar radiation data is undermined by the high cost of measuring equipment or by equipment that is non-functional. This study developed a feed-forward backpropagation artificial neural network model for the prediction of global solar radiation in Makurdi, Nigeria (7.7322 N lo...